    The Existence and Asymptotic Properties of a Backfitting Projection Algorithm Under Weak Conditions

    We derive the asymptotic distribution of a new backfitting procedure for estimating the closest additive approximation to a nonparametric regression function. The procedure employs a recent projection interpretation of popular kernel estimators provided by Mammen et al. (1997), and the asymptotic theory of our estimators is derived using the theory of additive projections reviewed in Bickel et al. (1995). Our procedure achieves the same bias and variance as the oracle estimator based on knowing the other components, and in this sense improves on the method analyzed in Opsomer and Ruppert (1997). We provide 'high level' conditions independent of the sampling scheme. We then verify that these conditions are satisfied in a time series autoregression under weak conditions. Keywords: Additive models, alternating projections, backfitting, kernel smoothing, local polynomials, nonparametric regression
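
    To make the backfitting idea referred to above concrete, the sketch below runs classical backfitting with Nadaraya-Watson smoothers on simulated data: each additive component is re-estimated by smoothing the partial residuals until the fits stop changing. It is only an illustration of the generic backfitting cycle, with an assumed Gaussian kernel and a fixed bandwidth, and is not the projection-based estimator analyzed in the paper.

    # Minimal sketch of classical backfitting for y = c + m1(x1) + m2(x2) + noise.
    # Illustration only: Gaussian-kernel Nadaraya-Watson smoothers and a fixed
    # bandwidth are assumed; this is not the paper's projection-based estimator.
    import numpy as np

    def nw_smooth(x, y, grid, h):
        # Nadaraya-Watson estimate of E[y | x] at the points in `grid`
        w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
        return (w @ y) / w.sum(axis=1)

    def backfit(x1, x2, y, h=0.2, n_iter=50, tol=1e-8):
        c = y.mean()
        m1 = np.zeros_like(y)
        m2 = np.zeros_like(y)
        for _ in range(n_iter):
            m1_old, m2_old = m1.copy(), m2.copy()
            # re-smooth the partial residuals for each component in turn
            m1 = nw_smooth(x1, y - c - m2, x1, h)
            m1 -= m1.mean()                      # centre for identifiability
            m2 = nw_smooth(x2, y - c - m1, x2, h)
            m2 -= m2.mean()
            if max(np.abs(m1 - m1_old).max(), np.abs(m2 - m2_old).max()) < tol:
                break
        return c, m1, m2

    rng = np.random.default_rng(0)
    x1, x2 = rng.uniform(size=500), rng.uniform(size=500)
    y = np.sin(2 * np.pi * x1) + x2 ** 2 + 0.1 * rng.standard_normal(500)
    c, m1, m2 = backfit(x1, x2, y)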

    Estimating Yield Curves by Kernel Smoothing Methods

    We introduce a new method for the estimation of discount functions, yield curves and forward curves for coupon bonds. Our approach is nonparametric and does not assume a particular functional form for the discount function, although we do show how to impose various important restrictions in the estimation. Our method is based on kernel smoothing and is defined as the minimum of some localized population moment condition. The solution to the sample problem is not explicit and our estimation procedure is iterative, rather like the backfitting method of estimating additive nonparametric models. We establish the asymptotic normality of our methods using the asymptotic representation of our estimator as an infinite series with declining coefficients. The rate of convergence is standard for one-dimensional nonparametric regression. Keywords: Coupon bonds; forward curve; Hilbert space; local linear; nonparametric regression; yield curve
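
    A stripped-down stand-in for the estimation problem described above: coupon bond prices are linear in the values of the discount function at the cash-flow dates, so with an assumed piecewise-linear discount curve on a yearly grid the curve can be recovered by ordinary least squares. The kernel weighting, the localized moment condition and the iterative backfitting-type solution of the paper are all omitted; the sketch only shows the pricing structure the estimator works with.

    # Sketch: recover a discount function d(t) from simulated coupon bond prices.
    # Assumptions made for illustration: annual coupons, a piecewise-linear d(t)
    # on a yearly grid, and plain least squares instead of the paper's
    # kernel-based, iterative estimator.
    import numpy as np

    def hat_basis(t, knots):
        # phi_k(t): piecewise-linear "hat" function attached to the k-th knot
        return np.column_stack([np.interp(t, knots, np.eye(len(knots))[k])
                                for k in range(len(knots))])

    rng = np.random.default_rng(1)
    knots = np.arange(1.0, 11.0)                       # yearly grid, 1..10 years
    true_d = lambda t: np.exp(-0.04 * t - 0.002 * t ** 2)

    bonds = []
    for _ in range(60):                                # simulate 60 coupon bonds
        mat = rng.integers(1, 11)                      # maturity in whole years
        coupon = rng.uniform(0.01, 0.08) * 100.0       # annual coupon, face value 100
        times = np.arange(1.0, mat + 1.0)
        cf = np.full(mat, coupon)
        cf[-1] += 100.0                                # redemption payment
        price = cf @ true_d(times) + rng.normal(0.0, 0.05)
        bonds.append((times, cf, price))

    # price_i = sum_j cf_ij * d(t_ij) is linear in the knot values of d
    A = np.array([cf @ hat_basis(times, knots) for times, cf, _ in bonds])
    P = np.array([price for _, _, price in bonds])
    d_hat, *_ = np.linalg.lstsq(A, P, rcond=None)
    yields = -np.log(d_hat) / knots                    # implied zero-coupon yields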

    Here Today Gone Tomorrow: The Timing of Contracts for Jurisdiction and Venue Under 28 U.S.C. 1391

    Testing Parametric versus Semiparametric Modelling in Generalized Linear Models

    We consider a generalized partially linear model E(Y|X,T) = G{X'b + m(T)}, where G is a known function, b is an unknown parameter vector, and m is an unknown function. The paper introduces a test statistic which allows one to decide between a parametric and a semiparametric model: (i) m is linear, i.e. m(t) = t'g for a parameter vector g; (ii) m is a smooth (nonlinear) function. Under linearity (i) it is shown that the test statistic is asymptotically normal. Moreover, for the case of binary responses, it is proved that the bootstrap works asymptotically. Simulations suggest that (in small samples) the bootstrap outperforms the calculation of critical values from the normal approximation. The practical performance of the test is shown in applications to data on East-West German migration and credit scoring. Keywords: linear models
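
    The sketch below mimics the testing problem in a reduced form: a binary response with logistic G and no linear part X'b, a generic integrated-squared-difference statistic comparing a kernel fit of E[Y|T] with a kernel-smoothed parametric (linear-index) fit, and a parametric bootstrap under the null to obtain the critical value. The statistic, bandwidth and bootstrap scheme are placeholder choices made for brevity, not the ones derived in the paper.

    # Sketch of a bootstrap specification test: linear m(T) versus smooth m(T)
    # in a binary-response model P(Y=1|T) = G(m(T)) with logistic G.
    # The statistic and bootstrap scheme are illustrative choices, not the
    # paper's exact construction, and the linear part X'b is omitted.
    import numpy as np

    def logistic(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logit(D, y, n_iter=25):
        # Newton-Raphson fit of a logistic regression with design matrix D
        beta = np.zeros(D.shape[1])
        for _ in range(n_iter):
            p = logistic(D @ beta)
            W = p * (1.0 - p) + 1e-8
            beta += np.linalg.solve(D.T @ (W[:, None] * D), D.T @ (y - p))
        return beta

    def nw_smooth(t, y, h):
        # Nadaraya-Watson smooth of y on t, evaluated at the sample points
        w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
        return (w @ y) / w.sum(axis=1)

    def test_stat(t, y, h):
        D = np.column_stack([np.ones_like(t), t])
        p_par = logistic(D @ fit_logit(D, y))    # parametric fit (m linear)
        m_np = nw_smooth(t, y, h)                # nonparametric fit of E[Y|T]
        m_par = nw_smooth(t, p_par, h)           # smooth the parametric fit too
        return np.mean((m_np - m_par) ** 2), p_par

    rng = np.random.default_rng(2)
    n, h, B = 400, 0.15, 200
    t = rng.uniform(size=n)
    y = rng.binomial(1, logistic(-1.0 + 2.0 * t + 1.5 * np.sin(2 * np.pi * t)))

    stat, p_null = test_stat(t, y, h)            # p_null: fitted null probabilities
    boot = np.array([test_stat(t, rng.binomial(1, p_null), h)[0] for _ in range(B)])
    print("bootstrap p-value:", np.mean(boot >= stat))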